Quantum relative entropy
In quantum information theory, quantum relative entropy is a measure of distinguishability between two quantum states. It is the quantum mechanical analog of relative entropy.
== Motivation ==

For simplicity, it will be assumed that all objects in the article are finite-dimensional.
We first discuss the classical case. Suppose the probabilities of a finite sequence of events are given by the probability distribution ''P'' = {''p''<sub>1</sub>, ..., ''p''<sub>''n''</sub>}, but we mistakenly assume it to be ''Q'' = {''q''<sub>1</sub>, ..., ''q''<sub>''n''</sub>}. For instance, we might mistake an unfair coin for a fair one. According to this erroneous assumption, our uncertainty about the ''j''-th event, or equivalently, the amount of information provided by observing the ''j''-th event, is
:\; - \log q_j.
The (assumed) average uncertainty of all possible events is then
:\; - \sum_j p_j \log q_j.
On the other hand, the Shannon entropy of the probability distribution ''P'', defined by
:\; - \sum_j p_j \log p_j,
is the real amount of uncertainty before observation. Therefore the difference between these two quantities
:\; - \sum_j p_j \log q_j - \left(- \sum_j p_j \log p_j\right) = \sum_j p_j \log p_j - \sum_j p_j \log q_j
is a measure of the distinguishability of the two probability distributions ''P'' and ''Q''. This is precisely the classical relative entropy, or Kullback–Leibler divergence:
:D_{\mathrm{KL}}(P\|Q) = \sum_j p_j \log \frac{p_j}{q_j}.
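As a quick numerical check of this definition, here is a minimal Python sketch that evaluates it for the unfair-versus-fair-coin example above; the function name kl_divergence and the probabilities 0.75/0.25 are illustrative choices, and natural logarithms are used, so the result is in nats.
<syntaxhighlight lang="python">
import math

def kl_divergence(p, q):
    """Classical relative entropy D_KL(P || Q) = sum_j p_j * log(p_j / q_j), in nats."""
    total = 0.0
    for p_j, q_j in zip(p, q):
        if p_j == 0.0:
            continue  # convention: 0 * log 0 = 0
        total += p_j * math.log(p_j / q_j)
    return total

# An unfair coin P that we mistakenly model as the fair coin Q.
P = [0.75, 0.25]
Q = [0.50, 0.50]
print(kl_divergence(P, Q))  # ~0.1308 nats
</syntaxhighlight>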
Note
#In the definitions above, the convention 0·log 0 = 0 is assumed, since lim<sub>''x'' → 0</sub> ''x'' log ''x'' = 0. Intuitively, one would expect an event of zero probability to contribute nothing towards the entropy.
#The relative entropy is not a metric. For example, it is not symmetric: the discrepancy in uncertainty from mistaking a fair coin for an unfair one is not the same as in the opposite situation.
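The quantum quantity named in the introduction can be sketched in the same spirit. Assuming the standard definition ''S''(''ρ''‖''σ'') = Tr[''ρ''(log ''ρ'' − log ''σ'')], which is not stated in the excerpt above and should be treated as an assumption here, the following NumPy/SciPy example shows that for commuting (here diagonal) states it reduces to the classical Kullback–Leibler divergence of the eigenvalue distributions, and that it exhibits the same asymmetry just noted.
<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import logm

def quantum_relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log rho - log sigma)], in nats.

    Assumes the support of rho lies inside the support of sigma,
    so that the matrix logarithms are well defined where needed.
    """
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

# Diagonal (commuting) density matrices reproduce the classical coin example.
rho = np.diag([0.75, 0.25])    # the "unfair coin" as a quantum state
sigma = np.diag([0.50, 0.50])  # the maximally mixed qubit, i.e. the "fair coin"
print(quantum_relative_entropy(rho, sigma))  # ~0.1308, the classical value above
print(quantum_relative_entropy(sigma, rho))  # ~0.1438, illustrating the asymmetry
</syntaxhighlight>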
